Mean Field Variational Bayes
Linear Response Methods for Accurate Covariance Estimates from Mean Field Variational Bayes
Giordano, Ryan J., Broderick, Tamara, Jordan, Michael I.
Mean field variational Bayes (MFVB) is a popular posterior approximation method due to its fast runtime on large-scale data sets. However, a well-known failing of MFVB is that it underestimates the uncertainty of model variables (sometimes severely) and provides no information about model variable covariance. We generalize linear response methods from statistical physics to deliver accurate uncertainty estimates for model variables---both for individual variables and coherently across variables. We call our method linear response variational Bayes (LRVB). When the MFVB posterior approximation is in the exponential family, LRVB has a simple, analytic form, even for non-conjugate models. Indeed, we make no assumptions about the form of the true posterior. We demonstrate the accuracy and scalability of our method on a range of models for both simulated and real data.
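As a toy illustration of the abstract's claim (not the authors' code), the sketch below works on a bivariate Gaussian, where MFVB is known to get the marginal precisions right but the variances wrong and to report zero cross-covariance. The linear response correction is applied in the form Sigma = (I - V H)^{-1} V; taking H to be the cross-term (zero-diagonal) Hessian of the log posterior is an assumption made for this toy case:

```python
import numpy as np

# Bivariate Gaussian "posterior" with precision matrix Lam.
# MFVB matches each marginal precision Lam[i, i], so its variances
# 1/Lam[i, i] understate the truth and its covariance V is diagonal.
Lam = np.array([[2.0, 1.0],
                [1.0, 3.0]])
Sigma_true = np.linalg.inv(Lam)          # exact posterior covariance
V = np.diag(1.0 / np.diag(Lam))          # MFVB covariance (diagonal)

# Linear response correction Sigma_hat = (I - V H)^{-1} V, with H
# taken here as the cross-term Hessian of the log posterior
# (zero diagonal) -- an illustrative choice for this Gaussian toy case.
H = -(Lam - np.diag(np.diag(Lam)))
Sigma_lrvb = np.linalg.solve(np.eye(2) - V @ H, V)

print(np.allclose(Sigma_lrvb, Sigma_true))  # True: exact for Gaussians
```

On this example the MFVB variance for the first coordinate is 0.5 against a true value of 0.6, and the correction recovers the full covariance matrix exactly, as the Gaussian case suggests it should.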
Particle Mean Field Variational Bayes
Tran, Minh-Ngoc, Tseng, Paco, Kohn, Robert
To solve this problem, there are two main classes of computational methods that provide different approaches to approximating π. The first is Markov chain Monte Carlo (MCMC) (Metropolis et al., 1953; Hastings, 1970; Robert and Casella, 1999). For many years, MCMC has been the standard approach to Bayesian analysis because of its theoretical soundness. The method constructs a Markov chain whose draws are simulation-consistent samples from the target distribution π. A general MCMC approach is the Metropolis-Hastings algorithm, which generates a Markov chain by first drawing a proposed state from a proposal distribution and then using an acceptance rule to decide whether to accept the proposal or stay at the current state (Robert and Casella, 1999).
- Government (0.67)
- Energy > Oil & Gas > Upstream (0.46)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Undirected Networks > Markov Models (0.74)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.68)
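The propose-then-accept recipe described in the abstract above can be sketched in a few lines. The target (a standard normal), the random-walk proposal, and all names below are illustrative choices for the sketch, not anything from the paper:

```python
import numpy as np

# Minimal random-walk Metropolis-Hastings: propose from a symmetric
# Gaussian around the current state, then accept or reject.
def log_target(x):
    # Log-density of the target up to a constant (standard normal here).
    return -0.5 * x * x

def metropolis_hastings(n_samples, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.normal()       # symmetric proposal
        log_ratio = log_target(proposal) - log_target(x)
        if np.log(rng.uniform()) < log_ratio:    # MH acceptance rule
            x = proposal                         # accept the move
        samples[i] = x                           # else stay at x
    return samples

draws = metropolis_hastings(20000)
print(draws.mean(), draws.var())
```

Because the proposal is symmetric, the acceptance ratio reduces to the ratio of target densities, so the target only needs to be known up to a normalizing constant.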
Density Estimation via Bayesian Inference Engines
Bayesian inference engines have become established as an important paradigm for inference in arbitrarily large and complex graphical models. Software platforms such as Infer.NET (Minka et al., 2018) and Stan (Carpenter et al., 2017) are instances of such Bayesian inference engines. They deliver approximate Bayesian inference, with varying degrees of inferential accuracy, by calling upon contemporary approaches such as expectation propagation, Hamiltonian Monte Carlo and variational approximation. The purpose of this short article is to show that effective and scalable probability density function estimation, or density estimation for short, can be achieved using Bayesian inference engines. We provide easy access for users of the R statistical computing environment (R Core Team, 2018) via a package named densEstBayes (Wand, 2020).
- Europe > Austria > Vienna (0.14)
- Europe > United Kingdom (0.04)
- Oceania > New Zealand (0.04)
Streamlined Variational Inference for Linear Mixed Models with Crossed Random Effects
Menictas, Marianne, Di Credico, Gioia, Wand, Matt P.
We derive streamlined mean field variational Bayes algorithms for fitting linear mixed models with crossed random effects. In the most general situation, where the dimensions of the crossed groups are arbitrarily large, streamlining is hindered by lack of sparseness in the underlying least squares system. Because of this we also consider a hierarchy of relaxations of the mean field product restriction. The least stringent product restriction delivers a high degree of inferential accuracy, but this accuracy must be weighed against its higher storage and computing demands. Faster sparse storage and computing alternatives are also provided, but come at the price of diminished inferential accuracy. This article provides full algorithmic details of three variational inference strategies, presents detailed empirical results on their pros and cons, and thus guides users in their choice of variational inference approach depending on problem size and computing resources.
Keywords: mean field variational Bayes; item response theory; Rasch analysis; scalable statistical methodology; sparse least squares systems.
- Europe > Austria > Vienna (0.14)
- Europe > United Kingdom > North Sea > Southern North Sea (0.04)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (0.68)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.48)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.34)
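The lack-of-sparseness point in the abstract above can be seen directly: when two grouping factors are crossed rather than nested, nearly every level of one factor co-occurs with every level of the other, so the cross block of the normal-equations matrix fills in and sparse least squares solvers lose their advantage. A small numpy sketch with invented group sizes (not the paper's algorithm):

```python
import numpy as np

# Two crossed grouping factors A and B, each observation carrying one
# level of each.  Z_a and Z_b are the random-effects indicator matrices.
rng = np.random.default_rng(1)
n, g_a, g_b = 2000, 20, 20
a = rng.integers(0, g_a, size=n)   # factor A level per observation
b = rng.integers(0, g_b, size=n)   # factor B level per observation

Z_a = np.zeros((n, g_a)); Z_a[np.arange(n), a] = 1.0
Z_b = np.zeros((n, g_b)); Z_b[np.arange(n), b] = 1.0

# Cross block of the least squares (normal-equations) system.  Entry
# (i, j) counts observations with A-level i and B-level j, so under
# crossing almost all entries are nonzero.
cross = Z_a.T @ Z_b
fill = np.mean(cross != 0)          # fraction of nonzero entries
print(fill > 0.9)                   # nearly fully dense
```

With nested factors, by contrast, each B level would belong to a single A level and the same block would be block-diagonal and sparse, which is what streamlined solvers exploit.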